Generalized penalized least squares and its statistical characteristics
Authors
Abstract
Similar resources
Penalized Least Squares and Penalized Likelihood
where pλ(·) is the penalty function. Best subset selection corresponds to pλ(t) = (λ/2)I(t ≠ 0). If we take pλ(t) = λ|t|, then (1.2) becomes the Lasso problem (1.1). Setting pλ(t) = at² + (1 − a)|t| with 0 ≤ a ≤ 1 results in the method of elastic net. With pλ(t) = λ|t|^q for some 0 < q ≤ 2, it is called bridge regression, which includes ridge regression as a special case when q = 2. Some penal...
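To make these penalty choices concrete, here is a small Python sketch (illustrative only; the function names and the parameter values are not from the paper) that evaluates each penalty pλ(t) elementwise:

import numpy as np

def best_subset_penalty(t, lam):
    # (lam / 2) * I(t != 0): penalizes the count of nonzero coefficients
    return (lam / 2.0) * (t != 0).astype(float)

def lasso_penalty(t, lam):
    # lam * |t|: the L1 penalty of the Lasso
    return lam * np.abs(t)

def bridge_penalty(t, lam, q):
    # lam * |t|**q for 0 < q <= 2; q = 2 recovers ridge, q = 1 the Lasso
    return lam * np.abs(t) ** q

def elastic_net_penalty(t, a):
    # a * t**2 + (1 - a) * |t| with 0 <= a <= 1: a convex mix of ridge and Lasso terms
    return a * t ** 2 + (1.0 - a) * np.abs(t)

t = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(lasso_penalty(t, lam=0.1))
print(bridge_penalty(t, lam=0.1, q=2))  # ridge as the q = 2 special case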
Penalized least squares versus generalized least squares representations of linear mixed models
The methods in the lme4 package for R for fitting linear mixed models are based on sparse matrix methods, especially the Cholesky decomposition of sparse positive-semidefinite matrices, in a penalized least squares representation of the conditional model for the response given the random effects. The representation is similar to that in Henderson’s mixed-model equations. An alternative represen...
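As a rough illustration of this representation (a minimal dense sketch, not lme4's sparse implementation; X, Z, Lambda, beta and all dimensions below are made-up inputs), the conditional mode of the spherical random effects solves a penalized least squares problem whose system matrix can be factored by Cholesky, much as in Henderson's mixed-model equations:

import numpy as np

# For fixed beta and relative covariance factor Lambda, solve
#   min_u || y - X beta - Z Lambda u ||^2 + ||u||^2
# via the normal equations ( (Z Lambda)'(Z Lambda) + I ) u = (Z Lambda)'(y - X beta).
rng = np.random.default_rng(0)
n, p, q = 50, 2, 5
X = rng.normal(size=(n, p))
Z = rng.normal(size=(n, q))
Lambda = 0.8 * np.eye(q)              # relative covariance factor (illustrative)
beta = np.array([1.0, -0.5])
y = X @ beta + Z @ rng.normal(size=q) + rng.normal(size=n)

ZL = Z @ Lambda
A = ZL.T @ ZL + np.eye(q)             # penalized least squares system matrix
L = np.linalg.cholesky(A)             # dense stand-in for the sparse Cholesky factor
u_star = np.linalg.solve(L.T, np.linalg.solve(L, ZL.T @ (y - X @ beta)))
print(u_star)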
Linear mixed models and penalized least squares
Linear mixed-effects models are an important class of statistical models that are not only used directly in many fields of application but also used as iterative steps in fitting other types of mixed-effects models, such as generalized linear mixed models. The parameters in these models are typically estimated by maximum likelihood (ML) or restricted maximum likelihood (REML). In general there...
Volterra filter identification using penalized least squares
Volterra filters have been applied to many nonlinear system identification problems. However, obtaining good filter estimates from short and/or noisy data records is a difficult task. We propose a penalized least squares estimation algorithm and derive appropriate penalizing functionals for Volterra filters. An example demonstrates that penalized least squares estimation can provide much more accurate filter...
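A second-order Volterra filter is linear in its kernel coefficients, so its identification can be written as an ordinary penalized least squares problem. The sketch below is illustrative: it uses a plain ridge penalty rather than the penalizing functionals derived in the paper, and the memory length M and weight lam are arbitrary choices.

import numpy as np
from itertools import combinations_with_replacement

rng = np.random.default_rng(1)
N, M = 400, 4                                   # record length and filter memory
x = rng.normal(size=N)

def volterra_design(x, M):
    # Each row holds the M most recent inputs plus all their pairwise products
    rows = []
    for n in range(M - 1, len(x)):
        lags = x[n - M + 1:n + 1][::-1]
        quad = [lags[i] * lags[j] for i, j in combinations_with_replacement(range(M), 2)]
        rows.append(np.concatenate([lags, quad]))
    return np.array(rows)

Phi = volterra_design(x, M)
h_true = rng.normal(size=Phi.shape[1])
y = Phi @ h_true + 0.1 * rng.normal(size=Phi.shape[0])

lam = 0.5
h_hat = np.linalg.solve(Phi.T @ Phi + lam * np.eye(Phi.shape[1]), Phi.T @ y)
print(np.round(h_hat - h_true, 2))              # estimation error under the ridge penalty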
Nonparametric regression estimation using penalized least squares
We present multivariate penalized least squares regression estimates. We use Vapnik–Chervonenkis theory and bounds on the covering numbers to analyze convergence of the estimates. We show strong consistency of the truncated versions of the estimates without any conditions on the underlying distribution.
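One common instance of such an estimate is kernel ridge regression, where the penalized least squares criterion sum_i (y_i − f(x_i))^2 + λ·||f||^2 is minimized over a reproducing-kernel space. The sketch below is generic (a Gaussian kernel with arbitrary bandwidth and λ, not the specific multivariate estimator or its truncated version analyzed in the paper):

import numpy as np

rng = np.random.default_rng(2)
n = 100
X = rng.uniform(-2, 2, size=(n, 2))                  # a bivariate design
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.1 * rng.normal(size=n)

def gauss_kernel(A, B, bw=1.0):
    # Gaussian kernel matrix between the rows of A and B
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * bw ** 2))

lam = 0.1
K = gauss_kernel(X, X)
alpha = np.linalg.solve(K + lam * np.eye(n), y)      # penalized least squares coefficients

X_new = np.array([[0.0, 0.0], [1.0, -1.0]])
print(gauss_kernel(X_new, X) @ alpha)                # regression estimate at new points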
Journal
Journal title: Geo-spatial Information Science
Year: 2006
ISSN: 1009-5020, 1993-5153
DOI: 10.1007/bf02826736